Make _request response more useful when stream: true #6
(On top of #10)
This PR is created to discuss a possible refactoring of `OpenAI._request`.

What's changed:

- `_request` now returns a synthetic response object, similar to the one returned when `stream: false` (see the sketch below).
- `ChatSession.generate_response` is updated accordingly, as there is no longer a need to re-parse the response strings.
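For reference, here is a minimal sketch of how the streamed chunks could be folded into such a synthetic object. This is not the actual `_request` implementation: it assumes the standard Chat Completions streaming payload (`choices[0].delta`, with `finish_reason` on the final chunk of a choice), and `assemble_stream` / `callback` are hypothetical names used only for illustration.

```python
from typing import Callable, Iterable, Optional


def assemble_stream(chunks: Iterable[dict],
                    callback: Optional[Callable[[str], None]] = None) -> dict:
    """Fold streamed chat-completion chunks into a non-streaming-shaped dict."""
    content_parts = []
    role = "assistant"
    finish_reason = None
    synthetic = {}

    for chunk in chunks:
        # Copy top-level metadata (id, model, created, ...) from the first chunk.
        if not synthetic:
            synthetic = {k: v for k, v in chunk.items() if k != "choices"}

        if not chunk.get("choices"):
            continue  # e.g. a trailing metadata-only chunk

        choice = chunk["choices"][0]
        delta = choice.get("delta", {})
        if "role" in delta:
            role = delta["role"]
        piece = delta.get("content")
        if piece:
            content_parts.append(piece)
            if callback is not None:
                callback(piece)  # live display of the streamed text
        if choice.get("finish_reason") is not None:
            finish_reason = choice["finish_reason"]

    # Shape the result like the stream: false response: choices[0].message.content.
    synthetic["object"] = "chat.completion"
    synthetic["choices"] = [{
        "index": 0,
        "message": {"role": role, "content": "".join(content_parts)},
        "finish_reason": finish_reason,
    }]
    return synthetic
```

With something along these lines, `ChatSession.generate_response` could pass the per-piece callback through for display and then use the returned object directly, instead of re-parsing concatenated strings.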
Reasoning:

Use case: the callback function primarily serves to display streaming output to the user. Once streaming has finished, however, it is simpler to work with the integrated data as a single response object.
The first change is the most important one, and I consider it a necessity. I would like to hear your opinion on the second.
To be done:

- Populate `finish_reason`, `usage`, etc. in the synthetic response object (see the sketch below).
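A rough idea of where those remaining fields could come from, assuming the standard streaming format: `finish_reason` arrives on the final chunk of each choice, while `usage` is only emitted (on a trailing chunk with empty `choices`) when the request sets `stream_options={"include_usage": True}` on newer API versions. `extract_usage_and_finish` is a hypothetical helper, not existing code.

```python
from typing import List, Optional, Tuple


def extract_usage_and_finish(chunks: List[dict]) -> Tuple[Optional[dict], Optional[str]]:
    """Pull usage and finish_reason out of buffered streaming chunks (sketch)."""
    usage = None
    finish_reason = None
    for chunk in chunks:
        # usage, when requested, appears on a trailing chunk with no choices.
        if chunk.get("usage"):
            usage = chunk["usage"]
        for choice in chunk.get("choices", []):
            if choice.get("finish_reason") is not None:
                finish_reason = choice["finish_reason"]
    return usage, finish_reason
```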